Sex offender

AI chatbots could help stop prisoner release errors, says justice minister

The Guardian

HMP Wandsworth gets green light to use AI after team sent in to find 'quick fixes' after spate of mistakes

Artificial intelligence chatbots could be used to stop prisoners from being mistakenly released from jail, a justice minister told the House of Lords on Monday. James Timpson said HMP Wandsworth had been given the green light to use AI after a specialised team was sent in to find "some quick fixes". A double manhunt was launched last week after the incorrect release of a sex offender and a fraudster from the prison in south-west London. Release errors over the past fortnight have been seized upon by opposition MPs as evidence of the helplessness of ministers in the face of chaos within the criminal justice system. David Lammy, the justice secretary, is expected to address parliament about the number of missing prisoners when MPs return on Tuesday. It is understood that AI could be used to read and process paper documents; help staff cross-reference names to ensure that inmates are no longer hiding their past crimes behind aliases; merge different datasets; and calculate release dates and sentences.


Sex offender banned from using AI tools in landmark UK case

The Guardian

A sex offender convicted of making more than 1,000 indecent images of children has been banned from using any "AI creating tools" for the next five years in the first known case of its kind. Anthony Dover, 48, was ordered by a UK court "not to use, visit or access" artificial intelligence generation tools without the prior permission of police as a condition of a sexual harm prevention order imposed in February. The ban prohibits him from using tools such as text-to-image generators, which can make lifelike pictures based on a written command, and "nudifying" websites used to make explicit "deepfakes". Dover, who was given a community order and a £200 fine, has also been explicitly ordered not to use Stable Diffusion software, which has reportedly been exploited by paedophiles to create hyper-realistic child sexual abuse material, according to records from a sentencing hearing at Poole magistrates court. The case is the latest in a string of prosecutions in which AI generation has emerged as an issue, and it follows months of warnings from charities about the proliferation of AI-generated sexual abuse imagery.


Tinder is losing the tool it uses for background checks

Engadget

The background-checking tool used by Match Group to offer a safety feature for Tinder users is shutting down. The non-profit and female-founded Garbo, which the dating app conglomerate has partnered with since 2019, will shut down its consumer tool at the end of August. "Most tech companies just see trust and safety as good PR," Kathryn Kosmides, Garbo's founder and CEO, told The Wall Street Journal, which published a report on the severed partnership. "I'd rather Garbo shift focus to our other efforts than allow the vision of Garbo to be compromised and relegated to a piece of big corporations' marketing goals." A Match Group spokesperson supplied a statement to Engadget.


Tinder's parent company is auditing its sexual violence prevention policies

Engadget

Tinder parent company Match Group is partnering with one of the largest anti-sexual violence organizations in the US to audit how it handles reports of sexual assault across its many dating platforms. The Rape, Abuse & Incest National Network (RAINN) will "conduct a comprehensive review of sexual misconduct reporting, moderation and response across Match Group's dating platforms and to work together to improve current safety systems and tools," the company said on Monday. The first phase of the review will focus on Tinder, Hinge and Plenty of Fish before moving on to Match's other platforms; the company owns around 40 other dating brands altogether. The partnership, which Axios was the first to report on, will continue through 2021, with recommended changes rolling out "shortly thereafter." This is an important move for Match, even if there aren't many details at the moment. While horror stories surface frequently, it's difficult to put an exact number on the incidents of sexual assault that happen through Tinder and other online dating platforms.


Dating apps face U.S. inquiry over underage use and sex offenders

The Japan Times

SAN FRANCISCO – A House subcommittee is investigating popular dating services such as Tinder and Bumble for allegedly allowing minors and sex offenders to use their services. Bumble, Grindr, The Meet Group and the Match Group, which owns such popular services as Tinder, Match.com and OkCupid, are the current targets of the investigation by the U.S. House Oversight and Reform subcommittee on economic and consumer policy. In separate letters Thursday to the companies, the subcommittee is seeking information on users' ages, procedures for verifying ages, and any complaints about assaults, rape or the use of the services by minors. It is also asking for the services' privacy policies and details on what users see when they review and agree to the policies. Although the minimum age for using internet services is typically 13 in the U.S., dating services generally require users to be at least 18 because of concerns about sexual predators.


Dating app Plenty of Fish reveals it leaked private names and zip codes of users

Daily Mail - Science & tech

Researchers discovered that the dating app Plenty of Fish was leaking information that users had set to private on their profiles. Users' first names and zip codes were exposed through the app's API, allowing malicious actors to pinpoint a user's location. Although the data was scrambled, experts were able to recover the information using freely available tools designed to analyze network traffic, as first reported by TechCrunch. The discovery was made by The App Analyst, an expert in digital apps, who found that the sensitive data was visible via Plenty of Fish's API on October 20th. A fix was developed and tested on November 5th, and on December 18th the company confirmed the sensitive data was no longer present in its API.


Would background checks make dating apps safer?

#artificialintelligence

Match Group, the largest dating app conglomerate in the US, doesn't perform background checks on any of its apps' free users. A ProPublica report today highlights several incidents in which registered sex offenders went on dates with women who had no idea they were talking to a convicted criminal. These men then raped the women on their dates, leaving the women to report them to the police and to the apps' moderators. The women had expected their dating apps to protect them, or at least vet users, only to discover that Match has little to no insight into who is using its apps. The piece walks through individual attacks and argues that the apps have no real case for not vetting their users.


How law enforcement agencies use artificial intelligence to fight crime 7wData

#artificialintelligence

Artificial intelligence (AI) has been on everyone's lips lately, and for good reason. The technology is constantly finding new applications and has already transformed a number of industries, including healthcare, communications, automotive, and financial, with others set to follow in the near future. Given the stakes involved, it may not be particularly surprising that law enforcement has somewhat lagged behind other sectors when it comes to the adoption of artificial intelligence. However, that's slowly starting to change, with law enforcement agencies around the world increasingly turning to AI to help them fight crime. A recent report published by MarketsandMarkets estimates that the global law enforcement software market will grow from $10 billion in 2017 to $18 billion by 2023.


Schools, fearing threats, look to facial recognition technology for additional security

FOX News

In this July 10, 2018 photo, a camera with facial recognition capabilities hangs from a wall while being installed at Lockport High School in Lockport, N.Y. The surveillance system that has kept watch on students entering Lockport schools for over a decade is getting a novel upgrade. Facial recognition technology soon will check each face against a database of expelled students, sex offenders and other possible troublemakers. It could be the start of a trend as more schools fearful of shootings consider adopting the technology, which has been gaining ground on city streets and in some businesses and government agencies. Just last week, Seattle-based digital software company RealNetworks began offering a free version of its facial recognition system to schools nationwide.


The Growing Problem Of Child Sex Dolls And Robots

International Business Times

Sex robots appear to be the next big thing for the adult entertainment industry. Unroboticized sex dolls are not new, but combined with state-of-the-art fabrication techniques, artificial intelligence (AI) and programming applications, such dolls may soon reach new levels of sophistication. As sex dolls become increasingly realistic, and their roboticization looms on the horizon, a key question is how the law should respond when such objects are made for, and used by, those with a sexual interest in children. Dolls for this market, manufactured overseas, are now starting to appear on the legal radar through attempts to import them into the country. The National Crime Agency (NCA) has warned that child-like sex dolls are being sold on the internet, and campaigners have urged the government to outlaw the trade.